95 research outputs found

    On the second eigenvalue of random bipartite biregular graphs

    We consider the spectral gap of a uniformly chosen random $(d_1,d_2)$-biregular bipartite graph $G$ with $|V_1|=n$, $|V_2|=m$, where $d_1, d_2$ may grow with $n$ and $m$. Let $A$ be the adjacency matrix of $G$. Under the assumption that $d_1\geq d_2$ and $d_2=O(n^{2/3})$, we show that $\lambda_2(A)=O(\sqrt{d_1})$ with high probability. As a corollary, combined with results of Tikhomirov and Youssef (2019), we confirm a conjecture of Cook (2017) that the second singular value of a uniform random $d$-regular digraph is $O(\sqrt{d})$ for $1\leq d\leq n/2$ with high probability. This also implies that the second eigenvalue of a uniform random $d$-regular digraph is $O(\sqrt{d})$ for $1\leq d\leq n/2$ with high probability. Assuming $d_2=O(1)$ and $d_1=O(n^2)$, we further prove that for a random $(d_1,d_2)$-biregular bipartite graph, $|\lambda_i^2(A)-d_1|=O(\sqrt{d_1(d_2-1)})$ for all $2\leq i\leq n+m-1$ with high probability. The proofs of both results are based on the size-biased coupling method introduced in Cook, Goldstein, and Johnson (2018) for random $d$-regular graphs and several new switching operations we define for random bipartite biregular graphs. Comment: 37 pages, 3 figures. Corollary 1.4 added, a few typos fixed.
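
    As a quick numerical companion to this bound, the sketch below samples an approximately $(d_1,d_2)$-biregular bipartite graph by a configuration-model pairing of half-edges (a stand-in for the exact uniform model, since multi-edges may occur) and compares $\lambda_2$ of its adjacency matrix with $\sqrt{d_1}$; the values of n, d1, d2 are illustrative and not taken from the paper.

        import numpy as np

        rng = np.random.default_rng(0)

        n, d1 = 300, 12            # left side: n vertices of degree d1
        d2 = 6                     # right-side degree; m is forced by n*d1 = m*d2
        m = n * d1 // d2
        assert n * d1 == m * d2, "degrees must satisfy n*d1 == m*d2"

        # Pair left and right half-edges uniformly at random.  Multi-edges can
        # occur, so this only approximates the uniform simple biregular graph.
        left_stubs = np.repeat(np.arange(n), d1)
        right_stubs = rng.permutation(np.repeat(np.arange(m), d2))

        A = np.zeros((n + m, n + m))
        for u, v in zip(left_stubs, right_stubs):
            A[u, n + v] += 1
            A[n + v, u] += 1

        eigs = np.linalg.eigvalsh(A)          # eigenvalues in ascending order
        lam1, lam2 = eigs[-1], eigs[-2]
        print("lambda_1 =", lam1, " (exactly sqrt(d1*d2) =", (d1 * d2) ** 0.5, ")")
        print("lambda_2 =", lam2, " vs sqrt(d1) =", d1 ** 0.5)

    For a biregular (multi)graph the top eigenvalue is exactly $\sqrt{d_1 d_2}$, so the quantity of interest is the next eigenvalue, which in such experiments typically lands within a small constant multiple of $\sqrt{d_1}$.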

    A non-backtracking method for long matrix and tensor completion

    We consider the problem of low-rank rectangular matrix completion in the regime where the matrix $M$ of size $n\times m$ is ``long'', i.e., the aspect ratio $m/n$ diverges to infinity. Such matrices are of particular interest in the study of tensor completion, where they arise from the unfolding of a low-rank tensor. In the case where the sampling probability is $\frac{d}{\sqrt{mn}}$, we propose a new spectral algorithm for recovering the singular values and left singular vectors of the original matrix $M$ based on a variant of the standard non-backtracking operator of a suitably defined bipartite weighted random graph, which we call a \textit{non-backtracking wedge operator}. When $d$ is above a Kesten-Stigum-type sampling threshold, our algorithm recovers a correlated version of the singular value decomposition of $M$ with quantifiable error bounds. This is the first result in the regime of bounded $d$ for weak recovery, and the first for weak consistency when $d\to\infty$ arbitrarily slowly without any polylog factors. As an application, for low-rank orthogonal $k$-tensor completion, we efficiently achieve weak recovery with sample size $O(n^{k/2})$ and weak consistency with sample size $\omega(n^{k/2})$.
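
    To make the sampling regime concrete, the following hedged sketch sets up the observation model only: a rank-one ``long'' matrix whose entries are revealed independently with probability $d/\sqrt{mn}$, followed by a plain rescaled SVD of the revealed entries. This naive estimator is not the paper's non-backtracking wedge operator; it is the standard spectral baseline that the non-backtracking construction is meant to improve on when $d$ stays bounded. The sizes n, m, d below are illustrative.

        import numpy as np

        rng = np.random.default_rng(1)

        n, m, d = 200, 20000, 30               # "long" matrix: aspect ratio m/n = 100
        sigma = 1.0
        u = rng.standard_normal(n); u /= np.linalg.norm(u)
        v = rng.standard_normal(m); v /= np.linalg.norm(v)
        M = sigma * np.outer(u, v)             # rank-one ground truth

        p = d / np.sqrt(n * m)                 # sampling probability from the abstract
        mask = rng.random((n, m)) < p
        Y = np.where(mask, M, 0.0) / p         # rescaled so that E[Y] = M

        # Naive baseline: read off the left singular vector of the rescaled samples.
        U, S, Vt = np.linalg.svd(Y, full_matrices=False)
        u_hat = U[:, 0]
        print("revealed entries:", int(mask.sum()), "out of", n * m)
        print("overlap |<u_hat, u>| =", abs(u_hat @ u))

    In the bounded-$d$ regime the revealed matrix is extremely sparse and such plain spectral baselines are known to degrade, which is the motivation for the non-backtracking approach.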

    Overparameterized random feature regression with nearly orthogonal data

    We investigate the properties of random feature ridge regression (RFRR) given by a two-layer neural network with random Gaussian initialization. We study the non-asymptotic behaviors of the RFRR with nearly orthogonal deterministic unit-length input data vectors in the overparameterized regime, where the width of the first layer is much larger than the sample size. Our analysis shows high-probability non-asymptotic concentration results for the training errors, cross-validations, and generalization errors of RFRR centered around their respective values for a kernel ridge regression (KRR). This KRR is derived from an expected kernel generated by a nonlinear random feature map. We then approximate the performance of the KRR by a polynomial kernel matrix obtained from the Hermite polynomial expansion of the activation function, whose degree only depends on the orthogonality among different data points. This polynomial kernel determines the asymptotic behavior of the RFRR and the KRR. Our results hold for a wide variety of activation functions and input data sets that exhibit nearly orthogonal properties. Based on these approximations, we obtain a lower bound for the generalization error of the RFRR for a nonlinear student-teacher model. Comment: 39 pages. A condition on the activation function is added in Assumption 2.
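
    As a concrete illustration of the objects being compared, the sketch below builds an RFRR with Gaussian first-layer weights and a ReLU activation and checks numerically that the empirical random-feature kernel concentrates around its expected kernel (for ReLU, the degree-one arc-cosine kernel) once the width far exceeds the sample size. The dimensions, the ReLU choice, the teacher model, and the ridge level are illustrative assumptions, not the paper's setting.

        import numpy as np

        rng = np.random.default_rng(2)

        n, d, N = 100, 200, 4000        # sample size, input dimension, width (N >> n)
        lam = 1e-2                      # ridge regularization level

        X = rng.standard_normal((n, d)) / np.sqrt(d)   # nearly unit-length, nearly orthogonal rows
        beta = rng.standard_normal(d) / np.sqrt(d)
        y = X @ beta + 0.1 * rng.standard_normal(n)    # a simple teacher, for illustration only

        W = rng.standard_normal((N, d))                # random Gaussian first layer, held fixed
        relu = lambda t: np.maximum(t, 0.0)

        Phi = relu(X @ W.T) / np.sqrt(N)               # random feature map, n x N
        K_rf = Phi @ Phi.T                             # empirical random-feature kernel

        # RFRR training-set predictions in kernel form: Phi w_hat = K_rf (K_rf + lam I)^{-1} y.
        alpha = np.linalg.solve(K_rf + lam * np.eye(n), y)
        train_err = np.mean((K_rf @ alpha - y) ** 2)

        # Expected kernel of ReLU features with Gaussian weights (arc-cosine kernel of
        # degree one); K_rf should concentrate around it as the width N grows.
        norms = np.linalg.norm(X, axis=1)
        cosang = np.clip((X @ X.T) / np.outer(norms, norms), -1.0, 1.0)
        theta = np.arccos(cosang)
        K_exp = np.outer(norms, norms) * (np.sin(theta) + (np.pi - theta) * np.cos(theta)) / (2 * np.pi)

        print("RFRR training error:", train_err)
        print("relative kernel deviation ||K_rf - K_exp|| / ||K_exp|| =",
              np.linalg.norm(K_rf - K_exp, 2) / np.linalg.norm(K_exp, 2))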

    Global eigenvalue fluctuations of random biregular bipartite graphs

    We compute the eigenvalue fluctuations of uniformly distributed random biregular bipartite graphs with fixed and growing degrees for a large class of analytic functions. As a key step in the proof, we obtain a total variation distance bound for the Poisson approximation of the number of cycles and cyclically non-backtracking walks in random biregular bipartite graphs, which might be of independent interest. As an application, we translate the results to adjacency matrices of uniformly distributed random regular hypergraphs. Comment: 45 pages, 5 figures.
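
    The Poisson-approximation step can be probed with a small simulation, sketched below under explicit assumptions: graphs are drawn by a configuration-model pairing with multi-edges collapsed (so only an approximation of the uniform simple model), and only 4-cycles are counted, via codegrees of left-vertex pairs. If a Poisson limit is a good description, the empirical mean and variance of the count should be close. The parameters n, d1, d2 and the number of trials are illustrative.

        import numpy as np

        rng = np.random.default_rng(3)

        n, d1, d2, trials = 200, 4, 2, 200
        m = n * d1 // d2
        assert n * d1 == m * d2

        def sample_four_cycle_count():
            # Configuration-model pairing of half-edges; collapse multi-edges.
            left = np.repeat(np.arange(n), d1)
            right = rng.permutation(np.repeat(np.arange(m), d2))
            B = np.zeros((n, m), dtype=int)        # biadjacency matrix
            np.add.at(B, (left, right), 1)
            B = np.minimum(B, 1)
            # Each 4-cycle is an unordered pair of left vertices together with an
            # unordered pair of their common neighbours, so sum binom(codegree, 2).
            C = B @ B.T
            pair_choices = C * (C - 1) // 2
            iu = np.triu_indices(n, k=1)
            return int(pair_choices[iu].sum())

        counts = np.array([sample_four_cycle_count() for _ in range(trials)])
        print("empirical mean of 4-cycle counts:", counts.mean())
        print("empirical variance              :", counts.var())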